Exact Decoding on Latent Variable Conditional Models is NP-Hard

Author

  • Xu Sun
Abstract

Latent variable conditional models, which include latent conditional random fields as a special case, are popular models for many natural language processing and vision processing tasks. The computational complexity of exact decoding/inference in latent conditional random fields has been unclear. In this paper, we clarify this complexity and demonstrate that exact decoding is NP-hard even in a sequential labeling setting. Furthermore, we propose the latent-dynamic inference method (LDI-Naive) and its bounded version (LDI-Bounded), which perform exact or almost-exact inference by using top-n search and dynamic programming.
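
To make the idea concrete, the following is a minimal sketch of an LDI-Bounded-style decoder on a toy latent conditional model. It assumes log-linear emission and transition scores over latent states with disjoint latent-state sets per label, and it substitutes brute-force enumeration of latent paths for the paper's top-n search; the function names and scoring interface are illustrative assumptions, not the paper's implementation.

# Illustrative sketch only: toy latent conditional model with per-position
# latent-state emission scores `emis` (T x H) and latent transition scores
# `trans` (H x H), both log-potentials. Each label owns a disjoint set of
# latent states. The paper's top-n latent-path search is replaced here by
# brute-force enumeration, which is feasible only for tiny examples.
import numpy as np
from itertools import product


def logsumexp(a, axis):
    a = np.asarray(a, dtype=float)
    m = a.max(axis=axis, keepdims=True)
    return np.squeeze(m + np.log(np.exp(a - m).sum(axis=axis, keepdims=True)), axis=axis)


def restricted_forward(emis, trans, allowed):
    """Log-sum over latent paths that stay inside allowed[t] at position t."""
    T, H = emis.shape
    alpha = np.full(H, -np.inf)
    alpha[allowed[0]] = emis[0, allowed[0]]
    for t in range(1, T):
        nxt = np.full(H, -np.inf)
        for h in allowed[t]:
            nxt[h] = emis[t, h] + logsumexp(alpha[allowed[t - 1]] + trans[allowed[t - 1], h], axis=0)
        alpha = nxt
    return logsumexp(alpha[allowed[-1]], axis=0)


def ldi_bounded(emis, trans, latent_of_label, max_candidates=64):
    """Return (label sequence, P(y|x)); exact once the mass bound certifies it."""
    T, H = emis.shape
    label_of_latent = {h: y for y, hs in latent_of_label.items() for h in hs}
    log_Z = restricted_forward(emis, trans, [np.arange(H)] * T)

    # Stand-in for top-n latent-path search: visit latent paths best-first.
    def path_score(p):
        return emis[0, p[0]] + sum(trans[p[t - 1], p[t]] + emis[t, p[t]] for t in range(1, T))
    paths = sorted(product(range(H), repeat=T), key=path_score, reverse=True)

    best_y, best_prob, mass_seen, seen = None, 0.0, 0.0, set()
    for p in paths[:max_candidates]:
        y = tuple(label_of_latent[h] for h in p)        # project latent path to labels
        if y in seen:
            continue
        seen.add(y)
        allowed = [np.asarray(latent_of_label[lab]) for lab in y]
        prob = np.exp(restricted_forward(emis, trans, allowed) - log_Z)  # exact P(y|x)
        mass_seen += prob
        if prob > best_prob:
            best_y, best_prob = y, prob
        if best_prob >= 1.0 - mass_seen:                # no unseen label sequence can win
            break                                       # best_y is the exact arg-max
    return best_y, best_prob                            # approximate if the bound never fired


# Tiny usage example: 3 positions, labels "A"/"B" with two latent states each.
rng = np.random.default_rng(0)
emis = rng.normal(size=(3, 4))
trans = rng.normal(size=(4, 4))
print(ldi_bounded(emis, trans, {"A": [0, 1], "B": [2, 3]}))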


Similar articles

Sequential Labeling with Latent Variables: An Exact Inference Algorithm and its Efficient Approximation

Latent conditional models have recently become popular in both the natural language processing and vision processing communities. However, establishing an effective and efficient inference method for latent conditional models remains an open question. In this paper, we describe the latent-dynamic inference (LDI), which is able to produce the optimal label sequence on latent conditional models by using eff...


Spatial Latent Gaussian Models: Application to House Prices Data in Tehran City

Latent Gaussian models are flexible models that are applied in several statistical applications. When posterior marginals or full conditional distributions in hierarchical Bayesian inference from these models are not available in closed form, Markov chain Monte Carlo methods are implemented. The component dependence of the latent field usually causes an increase in computational time and divergenc...


Leveraging the Exact Likelihood of Deep Latent Variable Models

Deep latent variable models combine the approximation abilities of deep neural networks and the statistical foundations of generative models. The induced data distribution is an infinite mixture model whose density is extremely delicate to compute. Variational methods are consequently used for inference, following the seminal work of Rezende et al. (2014) and Kingma & Welling (2014). We study t...


Tractable structure search in the presence of latent variables

The problem of learning the structure of a DAG model in the presence of latent variables presents many formidable challenges. In particular, there is an infinite number of latent variable models to consider, and these models possess features which make them hard to work with. We describe a class of graphical models which can represent the conditional independence structure induced by a latent va...


A Discriminative Latent Variable Model for Statistical Machine Translation

Large-scale discriminative machine translation promises to further the state-of-the-art, but has failed to deliver convincing gains over current heuristic frequency count systems. We argue that a principal reason for this failure is not dealing with multiple, equivalent translations. We present a translation model which models derivations as a latent variable, in both training and decoding, and...



Journal:
  • CoRR

Volume: abs/1406.4682  Issue: -

Pages: -

Publication date: 2014